
    Parameterized Learning and Distillation with Vortex-encoded Spectral Correlations

    Spectral computational methods leverage modal or nonlocal representations of data, and a physically realized approach to spectral computation is encoded diffraction. Encoded diffraction offers a hybrid approach that pairs analog wave propagation with digital back-end electronics; however, the intermediate sensor patterns are correlations rather than linear signal weights, which limits the development of robust and efficient downstream analyses. Here, with vortex encoders, we show that the solution for the signal field from sensor intensity takes the form of a polynomial regression, which is subsequently solved with a learned, linear transformation. This result establishes an analytic rationale for a spectral-methods paradigm in physically realized machine-learning systems. To demonstrate this paradigm, we quantify the learning that is transferred with an image basis using two speckle parameters: Singular-Value Decomposition Entropy (H_{SVD}) and Speckle-Analogue Density (SAD). We show that H_{SVD}, a proxy for image complexity, indicates the rate at which a model converges. Similarly, SAD, an averaged spatial frequency, marks a threshold for structurally similar reconstruction. With a vortex encoder, this approach with parameterized training may be extended to distill features. In fact, with images reconstructed with our models, we achieve classification accuracies that rival decade-old, state-of-the-art computer algorithms. This means that the process of learning compressed spectral correlations distills features that aid image classification, even when the goal images are feature-agnostic speckles. Our work highlights opportunities for analytic and axiom-driven machine-learning designs appropriate for real-time applications.

    Comment: Code is available at: https://github.com/altaiperry/Reconstruction23_Perr
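    The abstract does not spell out how H_{SVD} is computed; a common convention, assumed here, is the Shannon entropy of the image's normalized singular-value spectrum. A minimal sketch under that assumption (the function name `svd_entropy` and the normalization choice are illustrative, not taken from the paper's code):

    ```python
    import numpy as np

    def svd_entropy(image):
        """Shannon entropy of the normalized singular-value spectrum of an image."""
        s = np.linalg.svd(np.asarray(image, dtype=float), compute_uv=False)
        p = s / s.sum()          # normalize singular values to a probability distribution
        p = p[p > 0]             # drop zero modes to avoid log(0)
        return float(-np.sum(p * np.log(p)))

    # A rank-1 image concentrates all weight in one mode (entropy ~ 0),
    # while random noise spreads weight across many modes (higher entropy).
    rank1 = np.outer(np.arange(1, 9), np.arange(1, 9))
    noise = np.random.default_rng(0).random((8, 8))
    print(svd_entropy(rank1), svd_entropy(noise))
    ```

    Under this reading, higher H_{SVD} corresponds to a richer modal structure, consistent with its use in the abstract as a proxy for image complexity.
    
    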